PAC learning, VC dimension, and the arithmetic hierarchy

Author

  • Wesley Calvert
Abstract

We compute that the index set of PAC-learnable concept classes is m-complete Σ₃ within the set of indices for all concept classes of a reasonable form. All concept classes considered are computable enumerations of computable Π₁ classes, in a sense made precise here. This family of concept classes is sufficient to cover all standard examples, and also has the property that PAC learnability is equivalent to finite VC dimension.
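To illustrate the shattering notion underlying VC dimension (this sketch is our own illustration, not part of the paper's construction), the following Python snippet brute-forces the VC dimension of the class of threshold concepts on a small finite domain; all names and the choice of concept class are illustrative assumptions.

```python
from itertools import combinations

def shatters(concepts, points):
    """True if the concept class realizes every 0/1 labeling of `points`."""
    labelings = {tuple(c(x) for x in points) for c in concepts}
    return len(labelings) == 2 ** len(points)

def vc_dimension(concepts, domain, max_d=5):
    """Largest d <= max_d such that some d-subset of `domain` is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(concepts, s) for s in combinations(domain, k)):
            d = k
    return d

# Threshold concepts on {0, ..., 9}: c_t(x) = 1 iff x >= t.
thresholds = [lambda x, t=t: int(x >= t) for t in range(11)]
print(vc_dimension(thresholds, range(10)))  # prints 1
```

Any single point is shattered (choose t below or above it), but no pair {x, y} with x < y is, since the labeling (1, 0) is never realized; hence the class has VC dimension 1 and, by the usual characterization, is PAC learnable.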


Similar references

Higher dimensional PAC-learning and VC-dimension

The VC-dimension (Vapnik-Chervonenkis dimension) was introduced in the 1970s in connection with computational learning theory, combinatorics, and model theory, a branch of mathematical logic. In fact, it is well known that, for a given class C, the PAC-learnability of C, the finiteness of the VC-dimension of C, and the dependence (a notion from model theory) of a formula defining C are essentially the ...


A generalization of the PAC learning in product probability spaces

Three notions, namely dependent theories, VC-dimension, and PAC-learnability, have been found to be closely related. In addition to the known relations among these notions in model theory, finite combinatorics, and probability theory, Chernikov, Palacin, and Takeuchi found a relation between n-dependence and VCn-dimension, which are generalizations of dependence and VC-dimension, respectively. We are now wor...


Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension, obtained by discretizing the range of a real function class. We then point out that Sauer's Lemma is valid for the discretized VC dimension. We group real function classes having infinite VC dimension into four categories by using the dis...
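A minimal sketch (our own illustration, under assumed formalization, not the paper's actual construction) of the discretization idea: thresholding a real-valued function class at a level turns it into a {0,1}-valued concept class, whose realized labelings on a point set can then be counted as in the binary VC setting.

```python
def discretized_labels(functions, points, level):
    """Labelings induced on `points` by thresholding each real function at `level`."""
    return {tuple(int(f(x) >= level) for x in points) for f in functions}

# Real function class: linear functions f_a(x) = a * x over a grid of slopes.
funcs = [lambda x, a=a: a * x for a in [i / 10 for i in range(-10, 11)]]
points = (1.0, 2.0)
labels = discretized_labels(funcs, points, level=0.5)
print(len(labels))  # prints 3: (1, 0) is never realized at this level
```

Since only three of the four possible labelings appear, the pair (1.0, 2.0) is not shattered by this discretization at level 0.5.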


PAC-Learning with General Class Noise Models

We introduce a framework for class noise, in which most of the known class noise models for the PAC setting can be formulated. Within this framework, we study properties of noise models that enable learning of concept classes of finite VC-dimension with the Empirical Risk Minimization (ERM) strategy. We introduce simple noise models for which classical ERM is not successful. Aiming at a more ge...


A Theoretical Framework for Deep Transfer Learning

We generalize the notion of PAC learning to include transfer learning. In our framework, the linkage between the source and target tasks results from drawing the sample distributions of all tasks from the same distribution of distributions, and from restricting all source concepts and the target concept to belong to the same hypothesis subclass. We have two models: an adversary model and a rand...



Journal:
  • Arch. Math. Log.

Volume 54, Issue 

Pages  -

Publication year: 2015